Encryption Plans for Facebook & Instagram Are Delayed Until 2023
Meta, the parent company of Facebook, has announced that end-to-end encryption for messages on Facebook and Instagram will not arrive until 2023.
With end-to-end encryption, only the sender and receiver can read a message; neither law enforcement nor Meta itself can access its contents.
Child advocacy groups and lawmakers, however, have cautioned that the technology could make it harder for police to investigate child abuse.
According to the National Society for the Prevention of Cruelty to Children (NSPCC), private messaging is the front line of online child sexual abuse.
Priti Patel, the UK Home Secretary, has also criticized the technology, stating earlier this year that it might 'severely impede' law enforcement efforts to combat illegal activities, such as online child abuse.
Privacy or Protection?
End-to-end encryption secures data by encrypting it on the sending device so that it can only be decrypted on the receiving device, no matter which networks or servers it passes through in between.
Usually, the only way for anyone else to read a message is to gain physical access to an unlocked device that sent or received it.
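Conceptually, the exchange works something like the minimal sketch below, which uses the open-source PyNaCl library's public-key Box construction purely for illustration. This is not the protocol Meta's apps actually use; real messaging systems add key verification, forward secrecy, and group messaging on top of this basic idea.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their own private key and the receiver's public key.
sender_box = Box(sender_key, receiver_key.public_key)
ciphertext = sender_box.encrypt(b"hello")  # opaque to any server that relays it

# Only the receiver, holding the matching private key, can decrypt the message.
receiver_box = Box(receiver_key, sender_key.public_key)
plaintext = receiver_box.decrypt(ciphertext)
assert plaintext == b"hello"
```

The point of the sketch is that the relaying server only ever sees `ciphertext`; without a private key held on one of the two devices, it cannot recover the plaintext.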
The technology is already used by WhatsApp, Meta's popular messaging app, but not by the company's other products.
Last year, the NSPCC issued Freedom of Information requests to 46 police forces in England, Wales, and Scotland, requesting a breakdown of the platforms used to conduct sexual offenses against minors.
Getting it Right
Meta's worldwide head of safety, Antigone Davis, explained that the delay in introducing encryption until 2023 was due to the company's desire to 'get this right.'
She also listed a number of additional precautionary steps that the organization has already implemented, such as:
1. 'Proactive detection technology' checks for unusual patterns of behavior, such as an account that frequently creates new profiles or messages a large number of people it doesn't know (a simplified, hypothetical sketch of this kind of rule appears after this list).
2. By default, under-18 users are placed in private or 'friends only' accounts, and adults who aren't already connected to them are barred from messaging them.
3. In-app prompts teach young people how to avoid unwanted interactions.
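The signals Meta actually uses are not public. The following is a purely hypothetical, rule-based sketch of the kind of check described in point 1; the `Account` fields and the thresholds are illustrative assumptions, not Meta's implementation.

```python
from dataclasses import dataclass, field

# Hypothetical thresholds -- illustrative only, not Meta's actual values.
MAX_NEW_PROFILES_PER_WEEK = 3
MAX_UNCONNECTED_RECIPIENTS_PER_DAY = 20

@dataclass
class Account:
    profiles_created_this_week: int
    friends: set = field(default_factory=set)        # accounts already connected
    messaged_today: set = field(default_factory=set)  # accounts messaged today

def looks_suspicious(account: Account) -> bool:
    """Flag accounts matching the behavioral patterns described above."""
    unconnected = account.messaged_today - account.friends
    return (
        account.profiles_created_this_week > MAX_NEW_PROFILES_PER_WEEK
        or len(unconnected) > MAX_UNCONNECTED_RECIPIENTS_PER_DAY
    )
```

Notably, a check like this relies only on behavioral metadata (who messages whom, and how often), which is one reason such detection can still operate even when message contents are encrypted end to end.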
Andy Burrows, the NSPCC's head of child safety online strategy, welcomed Meta's decision to delay the rollout.
'They should only go forward with these steps if they can demonstrate that they have the technologies in place to ensure that children are not at increased danger of abuse,' he said.
He added: 'More than 18 months after an NSPCC-led global coalition of 130 child protection organizations raised the alarm about the dangers of end-to-end encryption, Facebook must now show it is serious about the child safety risks and not just playing for time while it weathers difficult headlines.'